
    The Information Processing Role of the Amygdala in Emotion


    Recognition Memory for Faces and Scenes

    Previous studies have suggested that face memory is unique; however, the evidence is inconclusive. To further explore this issue, we investigated recognition memory for unfamiliar faces and scenes. Participants (n = 123) intentionally memorized the stimuli and then engaged in recognition tests. Recognition was measured following short (20-minute) and long (3-week) retention intervals. Encoding strategies and intelligence were also measured. Recognition memory performance for faces was higher than that for scenes at both short and long intervals; however, the effect of retention interval differed between faces and scenes. A relationship between encoding strategies and memory performance was found for scenes but not for faces. The relationship between intelligence and memory performance also differed between faces and scenes. These results suggest that memory for faces is more robust than memory for scenes and relies on different cognitive mechanisms.

    Facial feedback affects valence judgments of dynamic and static emotional expressions

    The ability to judge others' emotions is required for the establishment and maintenance of smooth interactions in a community. Several lines of evidence suggest that the attribution of meaning to a face is influenced by the facial actions produced by an observer during the observation of that face. However, empirical studies testing causal relationships between observers' facial actions and emotion judgments have reported mixed findings. We investigated this issue by measuring emotion judgments in terms of valence and arousal dimensions while comparing dynamic vs. static presentations of facial expressions. We presented pictures and videos of facial expressions of anger and happiness. Participants (N = 36) were asked to differentiate the gender of the faces while activating either the corrugator supercilii muscle (brow lowering) or the zygomaticus major muscle (cheek raising). They were also asked to evaluate the internal states of the stimuli using the affect grid while maintaining the facial action until they finished responding. The cheek-raising condition increased the attributed valence scores compared with the brow-lowering condition. This effect of facial actions was observed for static as well as dynamic facial expressions. These data suggest that facial feedback mechanisms contribute to judgments of the valence of emotional facial expressions.

    The atypical social brain network in autism: Advances in structural and functional MRI studies

    Purpose of review: To review advances in structural and functional MRI studies regarding the neural underpinnings of social atypicalities in autism spectrum disorder (ASD). Recent findings: Consistent with the hypothesis that the social brain network, which includes regions such as the amygdala and superior temporal sulcus, is atypical in ASD, recent structural MRI studies have identified regional gray matter volume abnormalities in the social brain regions in ASD groups compared with typically developing groups. Studies evaluating gray matter volume covariance and white matter volume/integrity suggested network-level abnormalities associated with the social brain regions. Recent functional MRI studies assessing resting-state neural activity showed reduced functional connectivity among the social brain regions in individuals with ASD compared with typically developing groups. Similarly, task-based functional MRI studies recently revealed a reduction in regional activity and intraregional functional coupling in the social brain regions during the processing of social stimuli in individuals with ASD. Summary: These structural and functional MRI studies provide supportive evidence for the hypothesis that an atypical social brain network underlies behavioral social problems in ASD.

    Model Building of Metal Oxide Surfaces and Vibronic Coupling Density as a Reactivity Index: Regioselectivity of CO₂ Adsorption on Ag-loaded Ga₂O₃

    The step-by-step hydrogen-terminated (SSHT) model is proposed as a model for the surfaces of metal oxides. Using this model, it is found that the vibronic coupling density (VCD) can be employed as a reactivity index for surface reactions. As an example, the regioselectivity of CO₂ adsorption on the Ag-loaded Ga₂O₃ photocatalyst surface is investigated based on VCD analysis. The cluster model constructed by the SSHT approach reasonably reflects the electronic structures of the Ga₂O₃ surface. The geometry of CO₂ adsorbed on the Ag-loaded Ga₂O₃ cluster has a bent structure, which is favorable for its photocatalytic reduction to CO. Comment: 18 pages, 11 figures

    Reduced representational momentum for subtle dynamic facial expressions in individuals with autism spectrum disorders

    The cognitive mechanisms underlying social communication via emotional facial expressions are crucial for understanding the social impairments experienced by people with autism spectrum disorders (ASD). A recent study (Yoshikawa & Sato, 2008) found that typically developing individuals perceived the last image of a dynamic facial expression to be more emotionally exaggerated than a static facial expression; this perceptual difference is termed representational momentum (RM) for dynamic facial expressions. RM for dynamic facial expressions might be useful for detecting emotion in another's face and for predicting behavior changes. We examined RM for dynamic facial expressions in people with ASD using facial expression stimuli at three levels of emotional intensity (subtle, medium, and extreme). We predicted that individuals with ASD would show reduced RM for dynamic facial expressions. Eleven individuals with ASD (three with Asperger's disorder and eight with pervasive developmental disorder not otherwise specified) and 11 IQ-, age-, and gender-matched typically developing controls participated in this study. Participants were asked to select an image that matched the final image from dynamic and static facial expressions. Our results revealed that subjectively perceived images were more exaggerated for the dynamic than for the static presentation at all levels of intensity and in both groups. The ASD group, however, perceived a reduced degree of exaggeration for dynamic facial expressions under the subtle intensity condition. As facial expressions are often displayed subtly in daily communication, reduced RM for subtle dynamic facial expressions may prevent individuals with ASD from interacting appropriately with other people as a consequence of their difficulty detecting others' emotions.

    Spatiotemporal neural network dynamics for the processing of dynamic facial expressions.

    Elucidating the spatiotemporal dynamics of the neural network that processes facial expressions. Kyoto University press release, 2015-07-24. Dynamic facial expressions of emotion automatically elicit multifaceted psychological activities; however, the temporal profiles and dynamic interaction patterns of the underlying brain activities remain unknown. We investigated these issues using magnetoencephalography. Participants passively observed dynamic facial expressions of fear and happiness, or dynamic mosaics. Source-reconstruction analyses utilizing functional magnetic resonance imaging data revealed higher activation in broad regions of the bilateral occipital and temporal cortices in response to dynamic facial expressions than in response to dynamic mosaics at 150-200 ms and some later time points. The right inferior frontal gyrus exhibited higher activity for dynamic faces versus mosaics at 300-350 ms. Dynamic causal-modeling analyses revealed that dynamic faces activated the dual visual routes and the visual-motor route. The influence of feedforward connections dominated before 200 ms, and that of feedback connections dominated thereafter. These results indicate that hierarchical, bidirectional neural network dynamics within a few hundred milliseconds implement the processing of dynamic facial expressions.

    Common and unique impairments in facial-expression recognition in pervasive developmental disorder-not otherwise specified and Asperger's disorder

    This study was designed to identify specific difficulties and associated features related to the problems with social interaction experienced by individuals with pervasive developmental disorder-not otherwise specified (PDD-NOS) using an emotion-recognition task. We compared individuals with PDD-NOS or Asperger's disorder (ASP) and typically developing individuals in terms of their ability to recognize facial expressions conveying the six basic emotions. Individuals with PDD-NOS and ASP were worse at recognizing fearful faces than were controls. Individuals with PDD-NOS were less accurate in recognizing disgusted faces than were those with ASP. The results suggest that PDD subtypes are characterized by shared and unique impairments in the ability to recognize facial expressions. Furthermore, the ability to recognize fearful but not disgusted expressions was negatively correlated with the severity of social dysfunction in PDD-NOS and ASP. The results suggest that impaired recognition of fearful and disgusted faces may reflect the severity of social dysfunction across PDD subtypes and the specific problems associated with PDD-NOS, respectively. Characteristics associated with different levels of symptom severity in PDD-NOS are discussed in terms of similarities with brain damage and other psychiatric disorders.

    Exaggerated perception of facial expressions is increased in individuals with schizotypal traits.

    Emotional facial expressions are indispensable communicative tools, and social interactions involving facial expressions are impaired in some psychiatric disorders. Recent studies revealed that the perception of dynamic facial expressions is exaggerated in normal participants and that this exaggerated perception is weakened in autism spectrum disorder (ASD). Based on the notion that ASD and schizophrenia spectrum disorder lie at two extremes of a continuum with respect to social impairment, we hypothesized that schizophrenic characteristics would strengthen the exaggerated perception of dynamic facial expressions. To test this hypothesis, we investigated the relationship between the perception of facial expressions and schizotypal traits in a normal population. We presented dynamic and static facial expressions and asked participants to change an emotional face display to match the perceived final image. The presence of schizotypal traits was positively correlated with the degree of exaggeration for dynamic as well as static facial expressions. Among the schizotypy subscales, the paranoia trait was positively correlated with the exaggerated perception of facial expressions. These results suggest that schizotypal traits, specifically the tendency to over-attribute mental states to others, exaggerate the perception of emotional facial expressions.